Tencent Launches Embodied Multimodal Large Model HY-Embodied-0.5-X to Empower Robot Intelligent Interaction
Tencent Robotics X and the Hunyuan team have jointly open-sourced HY-Embodied-0.5-X, a multimodal large model optimized for robotic embodied tasks. Built on the MoT-2B architecture, it strengthens robots' ability to 'understand, clarify, and act,' and performs well in fine manipulation, spatial reasoning, action prediction, and risk assessment. The series ships in two versions, MoT-2B and MoE-32B, with the aim of improving robots' intelligent interaction in real-world environments.